| Name | Timestamp |
| --- | --- |
| 1 - mini-batch gradient descent | 2020-03-21T18:30:37.094792Z |
| 10 - the problem of local optima | 2020-03-21T18:30:37.095792Z |
| 2 - understanding mini-batch gradient descent | 2020-03-21T18:30:37.097792Z |
| 3 - exponentially weighted averages | 2020-03-21T18:30:37.098792Z |
| 4 - understanding exponentially weighted averages | 2020-03-21T18:30:37.101792Z |
| 6 - gradient descent with momentum | 2020-03-21T18:30:37.103792Z |
| 7 - RMSprop | 2020-03-21T18:30:37.105792Z |
| 8 - Adam optimization algorithm | 2020-03-21T18:30:37.105792Z |
| 9 - learning rate decay | 2020-03-21T18:30:37.107792Z |
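
Since the notes listed above cover exponentially weighted averages, momentum, RMSprop, and Adam, here is a minimal NumPy sketch of a single Adam update, which combines an exponentially weighted average of the gradients (momentum) with an exponentially weighted average of their squares (RMSprop), plus bias correction. The function name `adam_update` and the hyperparameter defaults are illustrative assumptions, not taken from any file in the listing.

```python
import numpy as np

def adam_update(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step on a mini-batch gradient `grad` for parameters `w`.

    m, v are running exponentially weighted averages (same shape as w),
    t is the 1-based step count used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * grad         # momentum: EWA of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2    # RMSprop: EWA of squared gradients
    m_hat = m / (1 - beta1 ** t)               # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)               # bias-corrected second moment
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Illustrative usage on a toy quadratic loss 0.5 * ||w||^2 (gradient = w).
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 101):
    grad = w                                   # stand-in for a mini-batch gradient
    w, m, v = adam_update(w, grad, m, v, t)
print(w)                                       # parameters move toward the minimum at 0
```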